17 research outputs found

    Design of a Humanoid Robot Eye

    This chapter addresses the design of a robot eye featuring the mechanics and motion characteristics of a human one. In particular, the goal is to provide guidelines for the implementation of a tendon-driven robot capable of emulating saccadic motions. The first part of the chapter reviews the physiological and mechanical characteristics of the eye plant in humans and primates. The fundamental motion strategies used by humans during saccades are then discussed, and the mathematical formulation of Listing's Law and the Half-Angle Rule, which specify the geometric and kinematic characteristics of ocular saccadic motions, is introduced.
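
    For reference, these two constraints are commonly stated as follows in the oculomotor literature; this is standard background and not necessarily the exact notation adopted in the chapter:

        % Listing's Law: with the head fixed, eye orientations relative to the primary
        % position are rotations whose axes lie in a fixed plane (Listing's plane)
        % orthogonal to the primary gaze direction \hat{e}_p, i.e. rotations with zero torsion:
        \[
          q \;=\; \cos\tfrac{\theta}{2} \;+\; \sin\tfrac{\theta}{2}\,\hat{n},
          \qquad \hat{n}\cdot\hat{e}_p = 0 .
        \]
        % Half-Angle Rule: for the orientation to remain compatible with Listing's Law
        % during a movement at gaze eccentricity \theta, the angular-velocity axis must
        % lie in a plane tilted out of Listing's plane, in the direction of gaze, by \theta/2.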

    Design of a Tactile Sensor for Robot Hands


    Large Scale Capacitive Skin for Robots


    The design and validation of the R1 personal humanoid

    In recent years the robotics field has witnessed an interesting new trend: several companies have started producing service robots intended to cooperate with humans. The robots developed so far are either rather expensive or unsuitable for manipulation tasks. This article presents the result of a project that aims to demonstrate the feasibility of an affordable humanoid robot. R1 is able to navigate and to interact with the environment (grasping and carrying objects, operating switches, opening doors, etc.). The robot is also equipped with a speaker and microphones, and it mounts a display in the head to support interaction through natural channels such as speech or (simulated) eye movements. The final cost of the robot is expected to be comparable to that of a family car and, when produced in large quantities, possibly significantly lower. This goal was pursued along three synergistic directions: the use of polymeric materials, a lightweight design, and the implementation of novel actuation solutions. These design lines, as well as the robot and its main features, are described hereafter.

    Combining Sensors Information to Enhance Pneumatic Grippers Performance

    The gripper is the far end of a robotic arm: it is responsible for the contacts between the robot and the items present in its workspace, or even in a social space. Providing grippers with intelligent behaviors is therefore fundamental, especially when the robot has to interact with human beings. As shown in this article, we built an instrumented pneumatic gripper prototype that relies on information from different sensors. Thanks to this information, the gripper prototype was able to detect the position of a given object in order to grasp it, to keep it safely between its fingers, and to avoid slipping in the case of any object movement, even a very small one. The gripper performance was evaluated by means of a generic grasping algorithm for robotic grippers, implemented in the form of a state machine. Several slip tests were carried out on the pneumatic gripper, which showed a very fast response time and high reliability. Objects of various sizes, shapes and hardnesses were employed to reproduce different grasping scenarios. We demonstrate that, through the use of force, torque, center-of-pressure and proximity information, the behavior of the developed pneumatic gripper prototype outperforms that of traditional pneumatic gripping devices.
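
    The abstract does not give the state machine itself; purely as an illustration of the kind of controller it describes, a grasp cycle driven by proximity, force and center-of-pressure readings might look like the sketch below. State names, thresholds and the gripper interface are assumptions for illustration, not taken from the paper.

        # Hypothetical sketch: a minimal grasp state machine driven by proximity,
        # normal-force and slip (center-of-pressure shift) signals.
        from enum import Enum, auto

        class GraspState(Enum):
            IDLE = auto()
            APPROACH = auto()
            CLOSE = auto()
            HOLD = auto()
            RELEASE = auto()

        class GraspController:
            def __init__(self, gripper, proximity_near=0.03, grip_force=5.0, slip_thresh=0.002):
                self.gripper = gripper                # assumed interface: open(), close(force), tighten(df)
                self.proximity_near = proximity_near  # [m] distance at which to start closing
                self.grip_force = grip_force          # [N] nominal grasp force
                self.slip_thresh = slip_thresh        # [m] center-of-pressure shift treated as slip
                self.state = GraspState.IDLE

            def step(self, proximity, normal_force, cop_shift, release_requested=False):
                """Advance the state machine by one control cycle using current sensor readings."""
                if self.state == GraspState.IDLE:
                    self.gripper.open()
                    self.state = GraspState.APPROACH
                elif self.state == GraspState.APPROACH:
                    # Object detected close to the fingers: start closing.
                    if proximity < self.proximity_near:
                        self.state = GraspState.CLOSE
                elif self.state == GraspState.CLOSE:
                    self.gripper.close(self.grip_force)
                    # Contact is established once the measured normal force reaches the target.
                    if normal_force >= self.grip_force:
                        self.state = GraspState.HOLD
                elif self.state == GraspState.HOLD:
                    # A shift of the center of pressure is interpreted as incipient slip:
                    # tighten the grip slightly to restore a stable grasp.
                    if cop_shift > self.slip_thresh:
                        self.gripper.tighten(0.5)
                    if release_requested:
                        self.state = GraspState.RELEASE
                elif self.state == GraspState.RELEASE:
                    self.gripper.open()
                    self.state = GraspState.IDLE
                return self.state

    A caller would poll the sensors at a fixed rate and invoke step() each cycle; torque readings, also mentioned in the abstract, could be folded into the same slip test.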

    An articulated talking face for the iCub

    Recent developments in human-robot interaction show how the ability to communicate with people in a natural way is of great importance for artificial agents. The implementation of facial expressions has been found to significantly increase the interaction capabilities of humanoid robots. For speech, displaying a correct articulation together with sound is mandatory to avoid audiovisual illusions like the McGurk effect (which leads to comprehension errors) and to enhance intelligibility in noise. This work describes the design, construction and testing of an animatronic talking face developed for the iCub robot. The talking head has an articulated jaw and four independent lip movements actuated by five motors. It is covered by a specially designed elastic tissue cover whose hemlines at the lips are attached to the motors via connecting linkages.

    Design and Validation of a Talking Face for the iCub

    Recent developments in human-robot interaction show how the ability to communicate with people in a natural way is of great importance for artificial agents. The implementation of facial expressions has been found to significantly increase the interaction capabilities of humanoid robots. For speech, displaying a correct articulation together with sound is mandatory to avoid audiovisual illusions like the McGurk effect (which leads to comprehension errors) and to enhance intelligibility in noisy conditions. This work describes the design, construction and testing of an animatronic talking face developed for the iCub robot. The talking head has an articulated jaw and four independent lip movements actuated by five motors. It is covered by a specially designed elastic tissue cover whose hemlines at the lips are attached to the motors via connecting linkages. The mechanical design and the control scheme have been evaluated through speech-intelligibility-in-noise (SPIN) perceptual tests, which demonstrate an absolute 10% intelligibility gain provided by the jaw and lip movements over an audio-only display.
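
    The abstract specifies the articulation degrees of freedom (one jaw plus four independent lip movements, driven by five motors) but not the software interface. Purely as an illustrative sketch, with all names and the motor API assumed rather than taken from the paper, a command mapping those five degrees of freedom to motor positions could look like this:

        # Hypothetical sketch: clamp five articulation parameters (jaw opening plus four
        # lip movements) to a normalized range and forward them to an assumed motor
        # interface. Degree-of-freedom names and set_position() are illustrative only.
        from dataclasses import dataclass

        @dataclass
        class FacePose:
            jaw: float          # 0.0 = closed, 1.0 = fully open
            upper_lip: float    # four independent lip movements, normalized to [0, 1]
            lower_lip: float
            left_corner: float
            right_corner: float

        def clamp(x: float) -> float:
            return max(0.0, min(1.0, x))

        def send_pose(motors, pose: FacePose) -> None:
            """Send one normalized articulation pose to the five face motors."""
            targets = [pose.jaw, pose.upper_lip, pose.lower_lip,
                       pose.left_corner, pose.right_corner]
            for motor_id, value in enumerate(targets):
                motors.set_position(motor_id, clamp(value))  # assumed driver API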

    A Tactile Sensor for the Fingertips of the Humanoid Robot iCub

    In order to successfully perform object manipulation, humanoid robots must be equipped with tactile sensors. However, the limited space available in robotic fingers imposes severe design constraints. In [1] we presented a small prototype fingertip incorporating a capacitive pressure-sensing system. This paper presents an improved version, which has been integrated on the hand of the humanoid robot iCub. The fingertip is 14.5 mm long and 13 mm wide. The capacitive pressure sensor system has 12 sensitive zones and includes the electronics needed to send the 12 measurements over a serial bus with only 4 wires. Each synthetic fingertip is shaped approximately like a human fingertip. Furthermore, an integral part of the capacitive sensor is a soft silicone foam, which makes the fingertip compliant. We describe the structure of the fingertip and its integration on the humanoid robot iCub, and present test results that show the characteristics of the sensor.
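
    The bus protocol is not described in the abstract; purely as an illustration of handling the 12 taxel measurements on the host side, and assuming a hypothetical frame of twelve 16-bit values, a minimal decoding step might look like this:

        # Hypothetical sketch: unpack one frame of 12 capacitance readings and convert
        # them to baseline-compensated activations. The frame layout (twelve 16-bit
        # little-endian values) is an assumption for illustration; the abstract only
        # states that 12 measurements are sent over a 4-wire serial bus.
        import struct

        NUM_TAXELS = 12

        def unpack_taxels(frame: bytes) -> list[int]:
            """Decode a raw frame into 12 integer capacitance readings."""
            if len(frame) != 2 * NUM_TAXELS:
                raise ValueError("expected %d bytes, got %d" % (2 * NUM_TAXELS, len(frame)))
            return list(struct.unpack("<%dH" % NUM_TAXELS, frame))

        def compensate(readings: list[int], baseline: list[int]) -> list[int]:
            """Subtract the no-contact baseline so that idle taxels read approximately 0."""
            return [r - b for r, b in zip(readings, baseline)]

        # Example with synthetic data: taxel 3 is pressed.
        baseline = [1000] * NUM_TAXELS
        raw = list(baseline)
        raw[3] += 250
        frame = struct.pack("<%dH" % NUM_TAXELS, *raw)
        print(compensate(unpack_taxels(frame), baseline))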

    Dynamic Control of a Rigid Pneumatic Gripper
